
    Electronic Structure of Ge in SiO2

    It is argued that one-electron theory is insufficient to account for the origin of the observed spectra of Ge in SiO2 (alpha-quartz) crystals. A simple model is employed to show that the impurity states responsible for the ESR spectra of SiO2:Ge are stabilised by many-electron polarisation effects associated with the Ge atom itself and its immediate oxygen neighbours.

    Preempting the Police

    The challenge of regulating police discretion is exacerbated by the fact that a great deal of questionable police activity exists in the legal shadows—unregulated practices that do not violate defined legal limits because they have generally eluded both judicial and legislative scrutiny. Local law enforcement strategies, like the maintenance of unauthorized police DNA databases and the routine practice of initiating casual street encounters, threaten fundamental notions of a free society but have largely failed to elicit a judicial or legislative response. This Article argues that, instead of establishing a floor for impermissible police misconduct and then ceding responsibility to the legislative branch, state courts should become more interventionist—prodding legislators to provide greater guidance about police activities that they condone by forcing them to explicitly endorse questionable police practices. Accordingly, state courts should use the intrastate preemption doctrine, which holds that state law can supplant municipal authority, to find that local police officers may not engage in certain activities. Rather than stifle municipal policy innovation, a finding of preemption can precipitate a policy debate that engages both legislators and the electorate in evaluating police activity. This “information-forcing” approach can promote a more democratic dialogue about police practices, provide stronger protections for the community, and confer greater legitimacy on the police activities that legislators choose to sanction.

    Estimation of Execution Parameters for k-Wave Simulations

    Estimation of execution parameters takes centre stage in the automatic offloading of complex biomedical workflows to cloud and high-performance facilities. Since ordinary users have no, or very limited, knowledge of the performance characteristics of particular tasks in the workflow, the scheduling system has to be able to select an appropriate amount of compute resources, e.g., compute nodes, GPUs, or processor cores, and to estimate the execution time and cost. The presented approach considers a fixed set of executables that can be used to create custom workflows, and collects performance data from successfully computed tasks. Since the workflows may differ in the structure and size of the input data, the execution parameters can only be obtained by searching the performance database and interpolating between similar tasks. This paper shows that it is possible to predict the execution time and cost with high confidence. If the task parameters are found in the performance database, the mean interpolation error stays below 2.29%. If only similar tasks are found, the mean interpolation error may grow up to 15%. Nevertheless, this is still an acceptable error, since cluster performance itself may vary on the order of a few percent.
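    The lookup-and-interpolate scheme described in the abstract can be sketched roughly as follows. This is an illustrative assumption of how such a performance database might work, not the paper's actual implementation: all class, method, and executable names are invented, and a simple linear interpolation over input size stands in for whatever similarity measure the authors use.

    ```python
    # Hypothetical sketch: predict a task's execution time from a database of
    # previously completed tasks. Exact matches are returned directly; otherwise
    # we interpolate linearly between the two most similar recorded tasks.
    from bisect import bisect_left

    class PerformanceDB:
        def __init__(self):
            # executable name -> sorted list of (input_size, runtime_seconds)
            self.records = {}

        def add(self, executable, input_size, runtime):
            entries = self.records.setdefault(executable, [])
            entries.append((input_size, runtime))
            entries.sort()

        def estimate(self, executable, input_size):
            entries = self.records[executable]
            sizes = [s for s, _ in entries]
            i = bisect_left(sizes, input_size)
            if i < len(entries) and entries[i][0] == input_size:
                return entries[i][1]      # exact match found in the database
            if i == 0:
                return entries[0][1]      # outside known range: nearest neighbour
            if i == len(entries):
                return entries[-1][1]
            (s0, t0), (s1, t1) = entries[i - 1], entries[i]
            # linear interpolation between the two most similar tasks
            return t0 + (t1 - t0) * (input_size - s0) / (s1 - s0)

    db = PerformanceDB()
    db.add("kspaceFirstOrder3D", 128**3, 120.0)   # invented sample timings
    db.add("kspaceFirstOrder3D", 256**3, 950.0)
    print(db.estimate("kspaceFirstOrder3D", 128**3))  # exact match -> 120.0
    ```

    The same table could carry a cost column per resource configuration; the abstract's 2.29% vs. 15% error figures correspond to the exact-match and similar-task branches above.
    
    
    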

    Judging Aggregate Settlement

    While courts historically have taken a hands-off approach to settlement, judges across the legal spectrum have begun to intervene actively in “aggregate settlements”—repeated settlements between the same parties or institutions that resolve large groups of claims in a lockstep manner. In large-scale litigation, for example, courts have invented, without express authority, new “quasi-class action” doctrines to review the adequacy of massive settlements brokered by similar groups of attorneys. In recent and prominent agency settlements, including ones involving the SEC and EPA, courts have scrutinized the underlying merits to ensure settlements adequately reflect the interests of victims and the public at large. Even in criminal law, which has lagged behind other legal systems in acknowledging the primacy of negotiated outcomes, judges have taken additional steps to review iterant settlement decisions routinely made by criminal defense attorneys and prosecutors. Increasingly, courts intervene in settlements out of a fear commonly associated with class action negotiations—that the “aggregate” nature of the settlement process undermines the courts’ ability to promote legitimacy, loyalty, accuracy, and the development of substantive law. Unfortunately, when courts step in to review the substance of settlements on their own, they may frustrate the parties’ interests, upset the separation of powers, or stretch the limits of their ability. The phenomenon of aggregate settlement thus challenges the judiciary’s duty to preserve the integrity of the civil, administrative, and criminal justice systems. This Article maps the new and critical role that courts must play in policing aggregate settlements. We argue that judicial review should exist to alert and press other institutions—private associations of attorneys, government lawyers, and the coordinate branches of government—to reform bureaucratic approaches to settling cases. Such review would not mean interfering with the final outcome of any given settlement. Rather, judicial review would instead mean demanding more information about the parties’ competing interests in settlement, more participation by outside stakeholders, and more reasoned explanations for the trade-offs made by counsel on behalf of similarly situated parties. In so doing, courts can provide an important failsafe that helps protect the procedural, substantive, and rule-of-law values threatened by aggregate settlements.

    The Criminal Class Action


    Modelling a New Product Model on the Basis of an Existing STEP Application Protocol

    In recent years, a great range of computer-aided tools has been developed to support the development process of various products. The goal of a continuous data flow, needed for high efficiency, requires powerful standards for data exchange. At the FZG (Gear Research Centre) of the Technical University of Munich, there was a need for a common gear data format for data exchange between gear calculation programs. The STEP standard ISO 10303 was developed for exactly this kind of purpose, but a suitable definition of gear data was still missing, even in Application Protocol AP 214, developed for the design process in the automotive industry. The creation of a new STEP Application Protocol, or the extension of an existing one, would be a very time-consuming normative process. Therefore, a new method was introduced by the FZG: some very general definitions of an Application Protocol (here AP 214) were used to derive rules for an exact specification of the required kind of data. In this case, a product model for gear units was defined based on elements of AP 214, so no change to the Application Protocol is necessary. Meanwhile, the product model for gear units has been published as a VDMA paper and successfully introduced for data exchange within the German gear industry associated with the FVA (German Research Organisation for Gears and Transmissions). This method can also be adopted for other applications not yet sufficiently defined by STEP.
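    The core idea—expressing domain-specific gear data through the generic elements an Application Protocol already provides, plus agreed naming rules, rather than adding new entities—can be illustrated with a small sketch. Everything below is an invented analogy: the class names, property names, and the pinion values do not come from ISO 10303-214 or the VDMA paper.

    ```python
    # Hypothetical illustration: gear data carried by generic, AP 214-style
    # property/value elements. Because only generic elements are used, no
    # change to the Application Protocol itself is required; the gear-specific
    # meaning lives entirely in the agreed property names.
    from dataclasses import dataclass, field

    @dataclass
    class Property:           # stands in for a generic property assignment
        name: str
        value: float
        unit: str

    @dataclass
    class ProductItem:        # stands in for a generic product item
        item_id: str
        description: str
        properties: list = field(default_factory=list)

    # A gear wheel described purely with generic elements and naming rules.
    pinion = ProductItem(
        item_id="gear_wheel_1",
        description="pinion",
        properties=[
            Property("number_of_teeth", 17, "-"),
            Property("normal_module", 4.5, "mm"),
            Property("helix_angle", 15.0, "deg"),
        ],
    )
    ```

    A receiving gear calculation program only needs the shared naming convention to recover the gear semantics from the otherwise generic exchange structure.
    
    
    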
